623 research outputs found

    REGULAR RATE AND THE BAY RIDGE CASE: A GUIDE TO LEGISLATIVE REVISION


    Mickey Mantle: An American Legend


    Collaborative applications used in a wireless environment at sea for use in Coast Guard Law Enforcement and Homeland Security missions

    This thesis analyzes the potential impact of incorporating wireless technologies, specifically an 802.11 mesh layer architecture and 802.16 Orthogonal Frequency Division Multiplexing (OFDM), in order to transmit data more effectively and efficiently and to create a symbiotic operational picture between Coast Guard Cutters, their boarding teams, Coast Guard Operation Centers, and various external agencies. Two distinct collaborative software programs, Groove Virtual Office and the Naval Postgraduate School's Situational Awareness Agent, are utilized over the Tactical Mesh and OFDM network configurations to improve the Common Operating Picture of involved units within a marine environment and to evaluate their potential impact for the Coast Guard. This is being done to increase the effectiveness and efficiency of Coast Guard units while they carry out their Law Enforcement and Homeland Security missions. Through multiple field experiments, including Tactical Network Topology and nuclear component sensing with Lawrence Livermore National Laboratory, we utilize commercial off-the-shelf (COTS) equipment and software to evaluate their impact on these missions.
    http://archive.org/details/collaborativeppl109452311
    Lieutenant Commander, United States Coast Guard
    Lieutenant, United States Coast Guard
    Approved for public release; distribution is unlimited

    Error, reproducibility and sensitivity: a pipeline for data processing of Agilent oligonucleotide expression arrays

    Background Expression microarrays are increasingly used to obtain large-scale transcriptomic information on a wide range of biological samples. Nevertheless, there is still much debate on the best ways to process data, to design experiments and to analyse the output. Furthermore, many of the more sophisticated mathematical approaches to data analysis in the literature remain inaccessible to much of the biological research community. In this study we examine ways of extracting and analysing a large data set obtained using the Agilent long oligonucleotide transcriptomics platform, applied to a set of human macrophage and dendritic cell samples. Results We describe and validate a series of data extraction, transformation and normalisation steps which are implemented via a new R function. Analysis of replicate normalised reference data demonstrates that intra-array variability is small (only around 2% of the mean log signal), while inter-array variability from replicate array measurements has a standard deviation (SD) of around 0.5 log2 units (around 6% of the mean). The common practice of working with ratios of Cy5/Cy3 signal offers little further improvement in terms of reducing error. Comparison to expression data obtained using Arabidopsis samples demonstrates that the large number of genes in each sample showing a low level of transcription reflects the real complexity of the cellular transcriptome. Multidimensional scaling is used to show that the processed data identifies an underlying structure which reflects some of the key biological variables that define the data set. This structure is robust, allowing reliable comparison of samples collected over a number of years by a variety of operators. Conclusions This study outlines a robust and easily implemented pipeline for extracting, transforming, normalising and visualising transcriptomic array data from the Agilent expression platform. 
The analysis is used to obtain quantitative estimates of the SD arising from experimental (non-biological) intra- and inter-array variability, and for a lower threshold for determining whether an individual gene is expressed. The study provides a reliable basis for further, more extensive studies of the systems biology of eukaryotic cells.
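The core of the pipeline described above (log transformation, per-array normalisation, and estimation of inter-array SD from replicates) can be sketched in a few lines. This is a minimal illustration using simulated data and simple median-centring, not the paper's actual R function; all parameter values are hypothetical stand-ins.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical data: 4 replicate arrays x 1000 genes, standing in for
# raw Agilent feature-extraction signals
true_log2 = rng.normal(8.0, 2.0, size=1000)
arrays = 2.0 ** (true_log2 + rng.normal(0.0, 0.5, size=(4, 1000)))

# Step 1: log2-transform the raw signal
log_arrays = np.log2(arrays)

# Step 2: median-centre each array so replicates share a common scale
# (a simple stand-in for the normalisation steps in the pipeline)
normalised = log_arrays - np.median(log_arrays, axis=1, keepdims=True)

# Step 3: inter-array variability -- SD of each gene across replicate arrays,
# analogous to the ~0.5 log2-unit figure reported in the abstract
interarray_sd = normalised.std(axis=0, ddof=1)
print(f"median inter-array SD: {np.median(interarray_sd):.2f} log2 units")
```

In a real analysis this per-gene SD distribution is what supports a lower expression threshold: genes whose signal is indistinguishable from the replicate noise floor are treated as unexpressed.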

    Laser Transmitter Design and Performance for the Slope Imaging Multi-Polarization Photon-Counting Lidar (SIMPL) Instrument

    The Slope Imaging Multi-polarization Photon-counting Lidar (SIMPL) instrument is a polarimetric, two-color, multibeam push broom laser altimeter developed through the NASA Earth Science Technology Office Instrument Incubator Program and has been flown successfully on multiple airborne platforms since 2008. In this talk we will discuss the laser transmitter performance and present recent science data collected over the Greenland ice sheet and sea ice in support of the NASA Ice, Cloud and land Elevation Satellite-2 (ICESat-2) mission to be launched in 2017

    Performance Considerations for the SIMPL Single Photon, Polarimetric, Two-Color Laser Altimeter as Applied to Measurements of Forest Canopy Structure and Composition

    The Slope Imaging Multi-polarization Photon-counting Lidar (SIMPL) is a multi-beam, micropulse airborne laser altimeter that acquires active and passive polarimetric optical remote sensing measurements at visible and near-infrared wavelengths. SIMPL was developed to demonstrate advanced measurement approaches of potential benefit for improved, more efficient spaceflight laser altimeter missions. SIMPL data have been acquired for a wide diversity of forest types in the summers of 2010 and 2011 in order to assess the potential of its novel capabilities for characterization of vegetation structure and composition. On each of its four beams SIMPL provides highly resolved measurements of forest canopy structure by detecting single photons with 15 cm ranging precision using a narrow-beam system operating at a laser repetition rate of 11 kHz. Associated with that ranging data, SIMPL provides eight amplitude parameters per beam, unlike the single amplitude provided by typical laser altimeters. Those eight parameters are the received energy parallel and perpendicular to the plane-polarized transmit pulse at 532 nm (green) and 1064 nm (near IR), for both the active laser backscatter retro-reflectance and the passive solar bi-directional reflectance. This poster presentation will cover the instrument architecture and highlight the performance of the SIMPL instrument with examples taken from measurements for several sites with distinct canopy structures and compositions. Specific performance areas, such as probability of detection, afterpulsing, and dead time, will be highlighted and addressed, along with examples of their impact on the measurements and how they limit the ability to accurately model and recover the canopy properties. To assess the sensitivity of SIMPL's measurements to canopy properties, an instrument model has been implemented in the FLIGHT radiative transfer code, based on Monte Carlo simulation of photon transport. 
SIMPL data collected in 2010 over the Smithsonian Environmental Research Center, MD, are currently being modelled and compared to other remote sensing and in situ data sets. Results on the adaptation of FLIGHT to model micropulse, single-photon ranging measurements are presented elsewhere at this conference. NASA's ICESat-2 spaceflight mission, scheduled for launch in 2016, will utilize a multi-beam, micropulse, single-photon ranging measurement approach (although non-polarimetric and only at 532 nm). Insights gained from the analysis and modelling of SIMPL data will help guide preparations for that mission, including development of calibration/validation plans and algorithms for the estimation of forest biophysical parameters
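The detector dead-time effect named in the abstract above can be illustrated with a small Monte Carlo sketch: photons arriving within the dead time of a previous detection are lost, biasing the measured count rate low. The rate and dead-time values below are hypothetical, not SIMPL's actual parameters, and the non-paralyzable detector model is a standard simplification.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical parameters (not SIMPL's actual values)
rate = 2e6         # photon arrival rate at the detector (photons/s)
dead_time = 30e-9  # non-paralyzable detector dead time (s)
duration = 1e-3    # observation window (s)

# Poisson arrivals: exponential inter-arrival times
n_draws = int(rate * duration * 1.5)
arrivals = np.cumsum(rng.exponential(1.0 / rate, size=n_draws))
arrivals = arrivals[arrivals < duration]

# Non-paralyzable dead time: a photon is lost if it arrives within
# dead_time of the previous *detected* photon
detected = []
last = -np.inf
for t in arrivals:
    if t - last >= dead_time:
        detected.append(t)
        last = t

measured_rate = len(detected) / duration
# Analytic correction for a non-paralyzable detector
predicted = rate / (1.0 + rate * dead_time)
print(f"true {rate:.3e}/s, measured {measured_rate:.3e}/s, "
      f"predicted {predicted:.3e}/s")
```

This is the kind of per-channel correction that single-photon altimetry analyses apply before interpreting return amplitudes, since bright surface returns saturate first and distort the recovered canopy profile.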

    Optimality Driven Nearest Centroid Classification from Genomic Data

    Nearest-centroid classifiers have recently been successfully employed in high-dimensional applications, such as in genomics. A necessary step when building a classifier for high-dimensional data is feature selection. Feature selection is frequently carried out by computing univariate scores for each feature individually, without consideration for how a subset of features performs as a whole. We introduce a new feature selection approach for high-dimensional nearest centroid classifiers that instead is based on the theoretically optimal choice of a given number of features, which we determine directly here. This allows us to develop a new greedy algorithm to estimate this optimal nearest-centroid classifier with a given number of features. In addition, whereas the centroids are usually formed from maximum likelihood estimates, we investigate the applicability of high-dimensional shrinkage estimates of centroids. We apply the proposed method to clinical classification based on gene-expression microarrays, demonstrating that the proposed method can outperform existing nearest centroid classifiers
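The basic ingredients described in the abstract above (per-class centroids, shrinkage toward the overall centroid, and nearest-centroid assignment) can be sketched as follows. This is a generic soft-threshold shrinkage illustration on simulated data, not the paper's optimality-driven greedy algorithm; the data dimensions and the shrinkage amount `delta` are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical expression data: 40 samples x 200 genes, two classes,
# with only the first 10 genes informative
n, p = 40, 200
y = np.array([0] * 20 + [1] * 20)
X = rng.normal(0.0, 1.0, size=(n, p))
X[y == 1, :10] += 1.5  # class shift on the informative genes

# Class centroids: per-class maximum-likelihood means
centroids = np.vstack([X[y == k].mean(axis=0) for k in (0, 1)])
overall = X.mean(axis=0)

# Soft-threshold shrinkage toward the overall centroid -- a simple
# stand-in for the high-dimensional shrinkage estimates discussed above;
# uninformative genes are pulled all the way back to the overall mean
delta = 0.5
diff = centroids - overall
shrunk = overall + np.sign(diff) * np.maximum(np.abs(diff) - delta, 0.0)

# Classify each sample by its nearest shrunken centroid (Euclidean)
d = ((X[:, None, :] - shrunk[None, :, :]) ** 2).sum(axis=2)
pred = d.argmin(axis=1)
accuracy = (pred == y).mean()
print(f"training accuracy: {accuracy:.2f}")
```

Because shrinkage zeroes out genes whose class centroids barely differ from the overall centroid, it performs implicit feature selection; the paper's contribution is to choose the retained feature subset by a direct optimality criterion rather than by such univariate thresholding.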